Gorilla, a LLaMA-based model fine-tuned to write API calls, outperforms state-of-the-art LLMs such as GPT-4 on that task: it substantially reduces hallucinated API calls and, through retriever-aware training, can adapt when API documentation changes. The authors have released Gorilla's code, models, training data, and demos on GitHub, and they plan to extend it to further domains such as Kubernetes, GCP, AWS, and OpenAPI.
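The retriever-aware idea can be sketched roughly as follows: at inference time, the current API documentation is retrieved and prepended to the user's instruction, so the model grounds its answer in up-to-date docs rather than stale training data. The docstore, prompt template, and helper names below are illustrative assumptions, not Gorilla's actual implementation.

```python
# Hypothetical sketch of retrieval-aware prompting, in the spirit of Gorilla.
# The docstore contents, template wording, and function names are assumptions.

API_DOCS = {
    "image-classification": (
        "torchhub: model = torch.hub.load('pytorch/vision', 'resnet50', "
        "weights='ResNet50_Weights.DEFAULT')"
    ),
}

def retrieve_doc(task: str) -> str:
    """Toy retriever: look up the freshest documentation for a task."""
    return API_DOCS[task]

def build_prompt(instruction: str, task: str) -> str:
    """Prepend the retrieved doc so the model's API call is grounded in it."""
    doc = retrieve_doc(task)
    return (
        "Use this API documentation for reference: " + doc + "\n"
        "Instruction: " + instruction
    )

prompt = build_prompt("Classify this photo of a cat.", "image-classification")
print(prompt)
```

Because the documentation is fetched per request, updating the docstore entry is enough to steer the model toward a changed API, with no retraining required.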